Convergence Analysis of Hopfield Neural Networks

Author

  • John J. Hopfield
Abstract

Abstract—In this paper, we analyze the convergence and stability properties of Hopfield Neural Networks (HNNs). The global convergence and asymptotic stability of HNNs underlie their successful application to various computing and optimization problems. After stating the mathematical model of the network, we analyze it using the Lyapunov Stability Theorem. First, we define a positive definite energy function for the system and then investigate its first derivative; according to the Lyapunov Stability Theorem, this derivative must be negative. The proof proceeds through a series of mathematical steps, and in Theorem 1 we give a sufficient condition for HNNs that guarantees convergence of the system. We state the Lyapunov Stability Theorem and use it in the proof. Finally, we present simulation results for a globally asymptotically stable system, in which the state can be observed to converge to the zero equilibrium point. The derived conditions are consistent with previous results in the literature, and the analysis contributes a new sufficient condition.
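The Lyapunov argument sketched in the abstract can be illustrated numerically. The sketch below simulates a standard continuous-time Hopfield model dx/dt = -x + W·g(x) with zero external input; the spectral-norm condition ||W||₂ < 1 used here (with the 1-Lipschitz activation tanh) is a classic sufficient condition for global asymptotic stability of the origin and is only an illustrative stand-in for the paper's Theorem 1, whose exact condition is not given in this abstract.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 5

# Symmetric weight matrix, rescaled so that ||W||_2 = 0.5 < 1.
# (Illustrative sufficient condition, not the paper's Theorem-1 condition.)
A = rng.normal(size=(n, n))
W = (A + A.T) / 2
W *= 0.5 / np.linalg.norm(W, 2)

def g(x):
    return np.tanh(x)  # |g(u)| <= |u|, Lipschitz constant 1

# Forward-Euler simulation of dx/dt = -x + W g(x); with zero external
# input the equilibrium point is the origin.
dt = 0.01
x = rng.normal(size=n)
norms = [float(np.linalg.norm(x))]
for _ in range(5000):
    x = x + dt * (-x + W @ g(x))
    norms.append(float(np.linalg.norm(x)))

# For V(x) = 0.5 ||x||^2, dV/dt <= -(1 - ||W||_2) ||x||^2 < 0,
# so ||x(t)|| decays monotonically to zero.
print(norms[0], norms[-1])
```

Running this, the state norm shrinks at every step and ends up many orders of magnitude below its initial value, matching the abstract's observation that the equilibrium point of the simulated system goes to zero.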


Related works

On global stability of Hopfield neural networks with discontinuous neuron activations

The paper introduces a general class of neural networks in which the neuron activations are modeled by discontinuous functions. The neural networks have an additive interconnecting structure and include, as particular cases, the Hopfield neural networks (HNNs) and the standard Cellular Neural Networks (CNNs), in the limiting situation where the HNNs and CNNs possess neurons with infinite gain....


Hopfield Neural Networks and Self-Stabilization

This paper studies Hopfield neural networks from the perspective of self-stabilizing distributed computation. Known self-stabilization results on Hopfield networks are surveyed. Key ingredients of the proofs are given. Novel applications of self-stabilization—associative memories and optimization—arising from the context of neural networks are discussed. Two new results at the intersection of H...


Exponential Stability of Discrete-Time Hopfield Neural Networks

In this paper, some sufficient conditions for the local and global exponential stability of discrete-time Hopfield neural networks with general activation functions are derived, which generalize existing results. By means of M-matrix theory and some inequality analysis techniques, the exponential convergence rate of the neural networks to the equilibrium is estimated, and for the loca...
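The kind of discrete-time iteration such results address can be sketched as follows. The contraction condition ||A||₂ + L·||W||₂ < 1 (for an L-Lipschitz activation) used here is a simple sufficient condition for global exponential stability of the origin; it is an illustrative stand-in for the M-matrix criteria mentioned in this abstract, not the paper's actual conditions.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 4

# Discrete-time Hopfield-type iteration: x(k+1) = A x(k) + W g(x(k)).
A = 0.4 * np.eye(n)                 # ||A||_2 = 0.4
B = rng.normal(size=(n, n))
W = 0.4 * B / np.linalg.norm(B, 2)  # ||W||_2 = 0.4, so 0.4 + 0.4 < 1

x = rng.normal(size=n)
norms = [float(np.linalg.norm(x))]
for _ in range(50):
    x = A @ x + W @ np.tanh(x)      # tanh is 1-Lipschitz
    norms.append(float(np.linalg.norm(x)))

# The contraction bound guarantees ||x(k)|| <= 0.8**k * ||x(0)||;
# the empirical geometric rate over 50 steps confirms it.
rate = (norms[-1] / norms[0]) ** (1 / 50)
print(rate)
```

Estimating the rate from the trajectory, as the last lines do, mirrors the idea of bounding the exponential convergence rate to the equilibrium, though the papers derive such bounds analytically rather than empirically.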


Estimate of exponential convergence rate and exponential stability for neural networks

Estimates of the exponential convergence rate and exponential stability are studied for a class of neural networks that includes the Hopfield neural networks and the cellular neural networks. Both local and global exponential convergence are discussed. Theorems for estimating the exponential convergence rate are established and bounds on the rate of convergence are given. The domains of attraction ...


On convergence of hopfield neural networks for real time image matching

This paper demonstrates an innovative approach to a fundamental problem in computer vision: mapping, in real time, a pixel in one image to a pixel in another image of the same scene, generally called the image correspondence problem. It is a novel real-time image matching method that combines Rotational Invariant Feature Selection for real-time images with the optimization capabilities of Hopfiel...




Publication date: 2011